Risk bounds for regularized least-squares algorithm with operator-valued kernels

Authors

  • Ernesto De Vito
  • Andrea Caponnetto
Abstract

We show that recent results in [3] on risk bounds for regularized least-squares on reproducing kernel Hilbert spaces can be straightforwardly extended to the vector-valued regression setting. We first briefly introduce central concepts on operator-valued kernels, then we show how risk bounds can be expressed in terms of a generalization of the effective dimension.

This report describes research done at the Center for Biological & Computational Learning, which is in the McGovern Institute for Brain Research at MIT, as well as in the Dept. of Brain & Cognitive Sciences, and which is affiliated with the Computer Science and Artificial Intelligence Laboratory (CSAIL). This research was sponsored by grants from: Office of Naval Research (DARPA) Contract No. MDA972-04-1-0037, Office of Naval Research (DARPA) Contract No. N00014-02-1-0915, National Science Foundation (ITR/SYS) Contract No. IIS-0112991, National Science Foundation (ITR) Contract No. IIS-0209289, National Science Foundation-NIH (CRCNS) Contract No. EIA-0218693, National Science Foundation-NIH (CRCNS) Contract No. EIA-0218506, and National Institutes of Health (Conte) Contract No. 1 P20 MH66239-01A1. Additional support was provided by: Central Research Institute of Electric Power Industry (CRIEPI), Daimler-Chrysler AG, Compaq/Digital Equipment Corporation, Eastman Kodak Company, Honda R&D Co., Ltd., Industrial Technology Research Institute (ITRI), Komatsu Ltd., Eugene McDermott Foundation, Merrill-Lynch, NEC Fund, Oxygen, Siemens Corporate Research, Inc., Sony, Sumitomo Metal Industries, and Toyota Motor Corporation.
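
To make the statement concrete, the estimator and the quantity involved can be written out. The following is a sketch in our own notation, based on the standard scalar-valued formulation of [3] that the report extends; it is not quoted from the report itself:

    f_\lambda = \operatorname*{argmin}_{f \in \mathcal{H}} \; \frac{1}{n} \sum_{i=1}^{n} \| f(x_i) - y_i \|_{\mathcal{Y}}^{2} + \lambda \| f \|_{\mathcal{H}}^{2},

    \mathcal{N}(\lambda) = \operatorname{Tr}\!\left[ T \, (T + \lambda I)^{-1} \right],

where \mathcal{H} is the reproducing kernel Hilbert space of \mathcal{Y}-valued functions induced by the operator-valued kernel, \mathcal{Y} is the output Hilbert space, and T is the kernel integral operator on L^2(X, \rho_X; \mathcal{Y}); \mathcal{N}(\lambda) is the generalized effective dimension referred to above.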

Similar resources

Functional Regularized Least Squares Classification with Operator-valued Kernels

Although operator-valued kernels have recently received increasing interest in various machine learning and functional data analysis problems such as multi-task learning or functional regression, little attention has been paid to the understanding of their associated feature spaces. In this paper, we explore the potential of adopting an operator-valued kernel feature space perspective for the an...
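
As a concrete illustration of the setting sketched above (our own minimal example, not code from the paper): the simplest operator-valued kernels are the separable ones, K(x, x') = k(x, x') B, where k is a scalar kernel and B is a symmetric positive semidefinite matrix coupling the output components. The Gaussian choice of k, the function names, and the regularization convention below are all illustrative assumptions.

    import numpy as np

    def gaussian_gram(X1, X2, sigma=1.0):
        # Scalar Gaussian kernel from pairwise squared distances.
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma**2))

    def ovk_rls_fit(X, Y, B, lam=1e-2, sigma=1.0):
        # Representer theorem: f(x) = sum_i k(x, x_i) B c_i, where the
        # coefficients solve (K kron B + n*lam*I) vec(C) = vec(Y).
        n, T = Y.shape
        K = gaussian_gram(X, X, sigma)
        A = np.kron(K, B) + n * lam * np.eye(n * T)
        return np.linalg.solve(A, Y.reshape(-1)).reshape(n, T)

    def ovk_rls_predict(X_train, X_test, C, B, sigma=1.0):
        # Row m of the output is sum_i k(x_m, x_i) c_i^T B (B symmetric).
        return gaussian_gram(X_test, X_train, sigma) @ C @ B

    # Toy usage: two coupled output components.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 3))
    Y = np.stack([X[:, 0], X[:, 0] + 0.1 * X[:, 1]], axis=1)
    B = np.array([[1.0, 0.5], [0.5, 1.0]])
    C = ovk_rls_fit(X, Y, B)
    Y_hat = ovk_rls_predict(X, X, C, B)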

Reproducing Kernel Hilbert Spaces in Learning Theory: the Sphere and the Hypercube

We analyze the regularized least-squares algorithm in learning theory with Reproducing Kernel Hilbert Spaces (RKHS). Explicit convergence rates for the regression and binary classification problems are obtained in particular for the polynomial and Gaussian kernels on the n-dimensional sphere and the hypercube. There are two major ingredients in our approach: (i) a law of large numbers for Hilber...

Scalable Matrix-valued Kernel Learning and High-dimensional Nonlinear Causal Inference

We propose a general matrix-valued multiple kernel learning framework for high-dimensional nonlinear multivariate regression problems. This framework allows a broad class of mixed norm regularizers, including those that induce sparsity, to be imposed on a dictionary of vector-valued Reproducing Kernel Hilbert Spaces [19]. We develop a highly scalable and eigendecomposition-free block coordinate ...
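
The dictionary construction can be made concrete in a few lines (a sketch under assumed conventions, not the scalable solver the paper develops): for nonnegative weights eta_m, the combination K(x, x') = sum_m eta_m k_m(x, x') B_m of scalar base kernels k_m and positive semidefinite output matrices B_m is itself a valid matrix-valued kernel, and a sparsity-inducing norm on the weights switches entire blocks of the dictionary off.

    import numpy as np

    def rbf(X1, X2, sigma):
        # Scalar RBF base kernel.
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma**2))

    def combined_gram(X, etas, sigmas, Bs):
        # Gram operator of K(x, x') = sum_m eta_m * k_m(x, x') * B_m,
        # assembled as an (n*T) x (n*T) matrix of Kronecker blocks.
        # Weights driven to zero by an l1 penalty (not shown) drop out.
        n, T = X.shape[0], Bs[0].shape[0]
        G = np.zeros((n * T, n * T))
        for eta, sigma, B in zip(etas, sigmas, Bs):
            if eta > 0.0:
                G += eta * np.kron(rbf(X, X, sigma), B)
        return G

    # Toy usage: three RBF widths sharing one output-coupling matrix.
    X = np.random.default_rng(1).normal(size=(20, 2))
    B = np.eye(2)
    G = combined_gram(X, [0.7, 0.0, 0.3], [0.5, 1.0, 2.0], [B, B, B])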

Publication date: 2005